statistical information — translation into Russian
Diclib.com
ChatGPT Dictionary

Translation and analysis of words with ChatGPT artificial intelligence

On this page you can get a detailed analysis of a word or phrase, produced with the best artificial-intelligence technology available today:

  • how the word is used
  • frequency of use
  • whether it occurs more often in spoken or written speech
  • translation options
  • usage examples (several phrases with translation)
  • etymology


A way of measuring the amount of information that an observable random variable carries about an unknown parameter of a distribution that models it
Information (statistics); Extreme physical information; Fisher information matrix; Information number; Fisher amount of information; Fisher matrix; Information matrix; Fisherian criterion; Fisher's information; Singular statistical model; Fisher Information
Results found: 1679
statistical information      

mathematics

статистические данные

statistical ensemble         
  • Evolution of an ensemble of classical systems in phase space (top); each system consists of one massive particle in a one-dimensional potential well (red curve, lower figure). The initially compact ensemble becomes swirled up over time.
  • Visual representation of five statistical ensembles (from left to right): microcanonical ensemble, canonical ensemble, grand canonical ensemble, isobaric-isothermal ensemble, isoenthalpic-isobaric ensemble
A set of possible states
Using statistical ensembles; Using Statistical Ensembles; Ensemble average; Statistical ensemble; Thermodynamic ensemble; Gibbsian ensemble; Statistical Ensemble; Ensemble averaging (statistical mechanics); Ensemble average (statistical mechanics); Statistical ensemble (mathematical physics)

mathematics

статистическая совокупность

статистический ансамбль

information science         
  • Visualisation of various methodological approaches to gaining insights from metadata.
  • Gottfried Wilhelm Leibniz, a German polymath who wrote primarily in Latin and French; his fields of study were metaphysics, mathematics, and theodicy.
  • Joseph Marie Jacquard
  • Vannevar Bush, a noted information scientist, ca. 1940–1944
A field primarily concerned with the analysis, collection, classification, manipulation, storage, retrieval and dissemination of information
Information Science; Information sciences; Information Sciences; Infosci; Information studies; Information Studies; Documentation (field); Information Science and Engineering; Abstracting information; Informatically; History of information science; Informational science; Informational sciences; Informational study; Informational studies; Information science and engineering

[infə'meiʃ(ə)n'saiəns]

synonym

informatics

information theory         
  • The binary entropy H_b(p); entropy is maximized at 1 bit per trial when the two possible outcomes are equally probable, as in an unbiased coin toss.
  • Scratches on the readable surface of a CD-R; music and data CDs are coded using error-correcting codes and can still be read despite minor scratches thanks to error detection and correction.
A mathematical theory from the field of probability theory and statistics
Information Theory; Classical information theory; Shannon theory; Information theorist; Shannon information theory; Semiotic information theory; Semiotic information; Information-theoretic; Shannons theory; Shannon's information theory; Applications of information theory
теория информации
information theory         

general vocabulary

теория информации

scientific discipline

See also

information

information model         
  • Database requirements for a CD collection in EXPRESS-G notation.
  • An ER diagram.
A representation of conceptual relationships between things
Information models; Information modeling; Information Modelling; Information modelling; Information modeling language

general vocabulary

информационная модель

See also

conceptual model; essential model; model

statistical law         
A statistical tendency that occurs in a broad range of datasets
Statistical law; Law of statistics
закон распределения
information theory         
теория информации; a mathematical theory, founded by C. Shannon, used to interpret the transmission of messages in communication systems.
rejection region         
A method of statistical inference
Hypothesis testing; Testing statistical hypotheses; Hypothesis test; Critical region; Statistical study; Significance test; Statistical tests; Positive data; Acceptance region; Rejection region; Tests of significance; Test (statistics); Statistical hypothesis; Statistical bias detection; Statistical significance test; Significance testing; Confirmatory data analysis; Hypotesting; Statistical hypothesis test; Common test statistics; Testing Hypotheses; Region of rejection; Statistical technique; Null hypothesis significance testing; Significant difference testing; Statistical difference testing; Positive result; Statistical test; Standard test statistics; Hypothesis Testing; Test of significance; Statistical threshold; Null-hypothesis significance-testing; Do not reject; Criticism of statistical hypothesis testing; Simple hypothesis; Statistical hypothesis testing controversy; Null hypothesis testing

general vocabulary

критическая область

область неприятия гипотезы

область непринятия гипотезы


Wikipedia

Fisher information

In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information.
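As an illustrative sketch (not part of the dictionary entry), the equivalence can be checked by hand for a Bernoulli(θ) model, where the score is x/θ − (1−x)/(1−θ) and the known closed form is I(θ) = 1/(θ(1−θ)); the function name below is hypothetical:

```python
# Hypothetical sketch: Fisher information of a Bernoulli(theta) model,
# computed directly as the variance of the score.
# Score: d/dtheta log f(x|theta) = x/theta - (1-x)/(1-theta)

def bernoulli_fisher_info(theta):
    """I(theta) as the variance of the score under f(x|theta)."""
    score = lambda x: x / theta - (1 - x) / (1 - theta)
    # E[score] over x in {0, 1} with probabilities (1-theta, theta);
    # the expected score is always zero.
    mean = (1 - theta) * score(0) + theta * score(1)
    # Var[score] = E[score^2] - E[score]^2
    return (1 - theta) * score(0) ** 2 + theta * score(1) ** 2 - mean ** 2

theta = 0.3
numeric = bernoulli_fisher_info(theta)
closed_form = 1.0 / (theta * (1 - theta))  # known Bernoulli result
assert abs(numeric - closed_form) < 1e-9
```

The exact sum over {0, 1} stands in for the expectation; for continuous models the same variance would be an integral over the sample space.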

The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth). The Fisher information matrix is used to calculate the covariance matrices associated with maximum-likelihood estimates. It can also be used in the formulation of test statistics, such as the Wald test.
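A minimal sketch of these two uses, again assuming a Bernoulli model with k successes in n trials (function and variable names are illustrative): the total Fisher information n·I(θ̂) gives the asymptotic standard error of the MLE, and the Wald statistic compares θ̂ with a hypothesized θ₀.

```python
import math

# Hypothetical sketch: Fisher information gives the asymptotic standard
# error of a maximum-likelihood estimate and a Wald test statistic,
# for a Bernoulli sample with k successes in n trials.

def wald_test(k, n, theta0):
    theta_hat = k / n                          # MLE of theta
    info = n / (theta_hat * (1 - theta_hat))   # total Fisher information n*I(theta_hat)
    se = 1.0 / math.sqrt(info)                 # asymptotic standard error of the MLE
    w = ((theta_hat - theta0) / se) ** 2       # Wald statistic, ~ chi-square(1) under H0
    return theta_hat, se, w

theta_hat, se, w = wald_test(k=62, n=100, theta0=0.5)
```

With these numbers w is about 6.1, above the 5% chi-square(1) cutoff of 3.84, so the Wald test would reject θ = 0.5 at that level.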

In Bayesian statistics, the Fisher information plays a role in the derivation of non-informative prior distributions according to Jeffreys' rule. It also appears as the large-sample covariance of the posterior distribution, provided that the prior is sufficiently smooth (a result known as the Bernstein–von Mises theorem, anticipated by Laplace for exponential families). The same result is used when approximating the posterior with Laplace's approximation, where the Fisher information appears as the covariance of the fitted Gaussian.
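To make Jeffreys' rule concrete, a small sketch under the same Bernoulli assumption (function names hypothetical): the prior is proportional to √I(θ) = θ^(−1/2)(1−θ)^(−1/2), which is the Beta(1/2, 1/2) density up to its normalizing constant 1/π.

```python
import math

# Hypothetical sketch: Jeffreys' rule sets the non-informative prior
# proportional to sqrt(I(theta)). For the Bernoulli model,
# I(theta) = 1/(theta*(1-theta)), so the prior is
# theta^(-1/2) * (1-theta)^(-1/2) -- Beta(1/2, 1/2) up to a constant.

def jeffreys_prior_unnormalized(theta):
    fisher_info = 1.0 / (theta * (1 - theta))
    return math.sqrt(fisher_info)

def beta_half_half_density(theta):
    # Beta(1/2, 1/2) density; its normalizing constant is 1/pi
    return theta ** -0.5 * (1 - theta) ** -0.5 / math.pi

# The two agree up to the constant factor pi at every theta in (0, 1):
ratio = jeffreys_prior_unnormalized(0.3) / beta_half_half_density(0.3)
```

The constant ratio (π at every θ) is the point: Jeffreys' rule only pins the prior down up to normalization.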

Statistical systems of a scientific nature (physical, biological, etc.) whose likelihood functions obey shift invariance have been shown to obey maximum Fisher information. The level of the maximum depends upon the nature of the system constraints.
